Overview

The Canadian insurance company AssurExperts Inc. (https://www.assurexperts.qc.ca) has commissioned us to define a new advertising strategy based on an analysis of the behaviour of its customer segments, in order to identify the segment or segments that look most promising.

Our assignment thus consists of two main tasks.

Our first objective is therefore to derive from these tasks the "data science goals" of our study.

Data Science Goals

In this case study, meeting our business objectives requires a careful examination of the data, sound feature engineering to profile the customer base, and finally a model built on the most important variables/attributes for classifying potential customers. This can be addressed with a good customer scoring method that quantifies what the business defines as the "degree of appetite" (appétence). In short, we will of course rely on the CRISP-DM methodology throughout this study:

Loading and understanding the data

raw_data <- read.delim("AssurancExpertsInc.txt")
A few columns are shown for better readability
SD1 SD2 SD3 SD4 SD5 SD6 SD7 SD8 SD9 SD10 SD11 SD12 SD13 SD14 SD15
33 1 3 2 8 0 5 1 3 7 0 2 1 2 6
37 1 2 2 8 1 4 1 4 6 2 2 0 4 5
37 1 2 2 8 0 4 2 4 3 2 4 4 4 2
9 1 3 3 3 2 3 2 4 5 2 2 2 3 4
40 1 4 2 10 1 4 1 4 7 1 2 2 4 4


The data contain a range of customer information, including income, age bracket, vehicle ownership, number of policies held and the level of contributions (premiums) paid, as well as more qualitative information on lifestyle and household type.

Before any further analysis, we set aside the 'STATUS' variable, which indicates the purpose assigned to each observation. We do this so that, in the modelling phase, we can use the approach best suited to this dataset: resampling and cross-validation, which should yield better accuracy, especially since we intend to build our own classifier by stacking several models and tuning their parameters.
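As a minimal sketch of the resampling idea mentioned above (illustrative only: `toy_class`, `k` and the counts are assumptions, not part of the study's code; the real modelling phase would apply this to the prepared data):

```r
# Minimal sketch: stratified k-fold assignment in base R, on a toy
# imbalanced outcome vector.
set.seed(42)
toy_class <- factor(rep(c("No", "Yes"), times = c(94, 6)))

k <- 5
folds <- integer(length(toy_class))
for (lvl in levels(toy_class)) {
  idx <- which(toy_class == lvl)
  # spread each class evenly across the k folds, in random order
  folds[idx] <- sample(rep_len(1:k, length(idx)))
}

# every fold keeps roughly the same No/Yes ratio
table(folds, toy_class)
```

Stratifying by class matters here precisely because the positive class is rare: plain random folds could otherwise leave a fold with almost no "Yes" observations.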

Throughout the study we also treat variables as immutable, to guarantee that no data are lost through inattention or mishandling.

prepared_data <- raw_data[1:(ncol(raw_data) - 1)]  # drop the last column (STATUS)
A few columns are shown for better readability
PO71 PO72 PO73 PO74 PO75 PO76 PO77 PO78 PO79 PO80 PO81 PO82 PO83 PO84 PO85 CLASS
9818 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 No
9819 0 0 0 0 0 1 0 0 0 1 0 0 0 0 0 Yes
9820 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 No
9821 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 No
9822 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 No


Now, to visualise the data more clearly for the purpose of understanding it, we allowed ourselves to modify the data and 'decode' the cells of the data frame, so as to obtain more readable plots.


9,239 respondents said No and 568 agreed to take out caravan insurance. It is clear that our dataset is highly imbalanced, with only 5.9% of the observations actually buying the insurance. Before launching our analysis and dealing with the imbalance, let us understand the characteristics of the observations that actually bought the "mobile home insurance". Since this is a cross-selling problem, we will focus on understanding the existing customers who typically buy the insurance.
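The imbalance can be checked directly from the counts quoted above (9,239 No vs 568 Yes are taken from the text, so the toy vector below is an illustration, not the study's own code; with the actual data one would run `prop.table(table(prepared_data$CLASS))`):

```r
# Sketch: class imbalance computed from the counts quoted in the text
cls <- factor(rep(c("No", "Yes"), times = c(9239, 568)))
round(100 * prop.table(table(cls)), 1)
```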


In the pie chart above, customers from roughly 10 main-type labels are shown. Customers belonging to main type 8 ("Family with grown ups") and main type 2 ("Driven Growers") are more likely to take out the Caravan policy.






This chart shows that customers who are skilled labourers, in the 11-23% and 24-36% brackets, and even the unskilled labourers, tend to hold caravan insurance.



From the section above, we learn that customers paying an average policy premium in the $1,000 to $4,499 range (level 6) are more likely to take out the caravan policy.



In the bar chart above, we see that customers who hold no social security insurance policy (0) are more likely to take out the Caravan policy.



In the bar chart above, customers from different age groups are compared against those who said yes to buying caravan insurance. Customers in the 40-50 age group are more likely to take out the caravan policy.



From the section above, we learn that customers with a medium level of education are more likely to buy the caravan insurance.



From the section above, we learn that customers with an income > 30K are more likely to buy the caravan insurance.



From the section above, we learn that customers holding a single fire policy are more likely to buy the caravan insurance.



We can observe that current holders of car and fire policies also tend to buy the "caravan insurance". Moreover, the positive (Yes) class population is generally lower_level_education and Income_30K. Let us examine more features of the positive class.



The two charts above are based on the features identified as important by our analysis.

Data preparation (with iteration back to data understanding, as CRISP-DM prescribes)

An initial examination of the data revealed that, for the religion attributes, the customers who are Roman Catholic are further subdivided into 10 levels covering a range of percentage values. Despite various attempts to find out what the percentage value actually represents for an attribute that would normally be a simple binary value (i.e. whether someone is Catholic or not), we were unable to find a credible explanation. Given this lack of clarity, it was decided to exclude the four religion-related fields. Before doing so, however, we ran a correlation analysis to make sure that these four religion attributes are not significantly correlated with holding caravan insurance. The following R command was used to compute the correlation between customers' religion and the Caravan insurance:

cor(prepared_data[1:85], as.numeric(prepared_data$CLASS))


Here are the rows for the variables describing the religion of the customers of the insurance under study:

##             [,1]
## SD6 -0.014993367
## SD7 -0.022810972
## SD8 -0.006226458
## SD9  0.035085999


As this correlation table shows, the correlations between customers' religion and holding caravan insurance are not very significant. On that basis, the four religion attributes were excluded from the more detailed analysis presented here, bringing the total number of variables down to 82.
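The exclusion step itself is not shown above; here is a minimal sketch, assuming the four religion columns are SD6-SD9 (the names appearing in the correlation table), demonstrated on a toy frame with the same naming scheme:

```r
# Sketch: drop the four religion columns by name (SD6-SD9 assumed)
religion_cols <- c("SD6", "SD7", "SD8", "SD9")

# toy frame standing in for prepared_data
toy <- data.frame(SD5 = 0, SD6 = 0, SD7 = 0, SD8 = 0, SD9 = 0, SD10 = 0)
toy <- toy[, !(names(toy) %in% religion_cols)]
names(toy)  # only SD5 and SD10 remain
```

Dropping by name rather than by position keeps the step robust if columns are later reordered.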


Before exploring the dataset further, each of the five remaining categorical variables was "unfolded" (i.e. converted into binary variables). For example, the first categorical variable, MOSTYPE, which defines the "customer subtype", had 41 levels, such as "Career and childcare" or "Middle class families". Each subtype is treated as a separate variable with a binary value indicating whether a customer belongs to that subtype or not. This increased the total number of variables from 82 to 155 (not counting the 5 replaced ones).

# Renaming the factor columns so we can unfold them without harming the other columns
colnames(prepared_data)[colnames(prepared_data)=="SD1"] <- "MOSTYPE"
colnames(prepared_data)[colnames(prepared_data)=="SD2"] <- "MAANTHUI"
colnames(prepared_data)[colnames(prepared_data)=="SD3"] <- "MGEMOMV"
colnames(prepared_data)[colnames(prepared_data)=="SD4"] <- "MGEMLEEF"
colnames(prepared_data)[colnames(prepared_data)=="SD5"] <- "MOSHOOFD"

# Factoring studied variables
prepared_data$MOSTYPE <- as.factor(prepared_data$MOSTYPE)
prepared_data$MAANTHUI <- as.factor(prepared_data$MAANTHUI)
prepared_data$MGEMOMV <- as.factor(prepared_data$MGEMOMV)
prepared_data$MGEMLEEF <- as.factor(prepared_data$MGEMLEEF)
prepared_data$MOSHOOFD <- as.factor(prepared_data$MOSHOOFD)

# Fixing missing levels
levels(prepared_data$MOSTYPE) <- c(levels(prepared_data$MOSTYPE), "14")
levels(prepared_data$MAANTHUI) <- c(levels(prepared_data$MAANTHUI), "9")

# unfolding and binding variables to dataframe
m1 <- model.matrix( ~ MOSTYPE - 1, data=prepared_data )
m2 <- model.matrix( ~ MAANTHUI - 1, data=prepared_data )
m3 <- model.matrix( ~ MGEMOMV - 1, data=prepared_data )
m4 <- model.matrix( ~ MGEMLEEF - 1, data=prepared_data )
m5 <- model.matrix( ~ MOSHOOFD - 1, data=prepared_data )
m_unfolded <- cbind(m1,m2)
m_unfolded <- cbind(m_unfolded, m3)
m_unfolded <- cbind(m_unfolded, m4)
m_unfolded <- cbind(m_unfolded, m5)
prepared_data <- cbind(prepared_data, m_unfolded)

This yields the following result for the resulting data frame:
A few columns are shown for better readability
MGEMLEEF4 MGEMLEEF5 MGEMLEEF6 MOSHOOFD1 MOSHOOFD2 MOSHOOFD3 MOSHOOFD4 MOSHOOFD5
9818 1 0 0 0 0 0 0 0
9819 0 0 0 0 0 0 0 1
9820 0 0 0 0 0 0 0 0
9821 0 0 0 0 0 0 0 0
9822 0 0 0 0 1 0 0 0


The training dataset, with 145 variables, was subjected to further exploratory data analysis. The following R command was used to get an overview of the values of each customer-profile variable. Again for readability, only part of the resulting description is included in this report:

library(Hmisc)  # provides describe()
describe(prepared_data)
## prepared_data[1:2] 
## 
##  2  Variables      9822  Observations
## ---------------------------------------------------------------------------
## MOSTYPE 
##        n  missing distinct 
##     9822        0       40 
## 
## lowest : 1  2  3  4  5 , highest: 37 38 39 40 41
## ---------------------------------------------------------------------------
## MAANTHUI 
##        n  missing distinct 
##     9822        0        9 
##                                                                 
## Value          1     2     3     4     5     6     7     8    10
## Frequency   8915   821    64     4     3     3     8     2     2
## Proportion 0.908 0.084 0.007 0.000 0.000 0.000 0.001 0.000 0.000
## ---------------------------------------------------------------------------


This analysis provided detailed insight into the nature of the customer profile data. In particular, it clearly shows how many customers fall into each subtype of the main categorical variables. For example, the detailed table makes it clear that the distribution of customers across the customer subtypes is not uniform. Interestingly, there are no customers in the "Junior Cosmopolitan" or "Students in apartment" subtypes, which is somewhat corroborated by the relatively very low number of customers in the "20-30 years" age group. To see whether any particularly strong relationship exists between the customer attributes and caravan insurance, we decided to examine the proportion of caravan insurance holders against each of the remaining (independent) variables. The following R command was used for this analysis:

for (i in colnames(prepared_data))
  print(prop.table(table(prepared_data[,i], prepared_data$CLASS), 1))
## For average household size
##    
##                0            1
##   1 0.0011199348 0.0448992059
##   2 0.0198533903 0.3482997353
##   3 0.0279983710 0.4314803502
##   4 0.0099776013 0.1052738750
##   5 0.0007126858 0.0100794136
##   6 0.0000000000 0.0003054368
## For average age
##    
##                0            1
##   1 0.0001018123 0.0104866626
##   2 0.0158827123 0.2293830177
##   3 0.0308491142 0.4938912645
##   4 0.0106902871 0.1702300957
##   5 0.0020362452 0.0314599878
##   6 0.0001018123 0.0048869884


The results did not point to any further link between caravan insurance and any other specific variable. Overall, they also confirmed the relatively low number of caravan insurance policy holders in the database. After exploring the dependencies between the variables, we performed a linear logistic regression analysis as follows:

prepared_data <- viz_data
# Recode the CLASS factor ("No"/"Yes") to 0/1
prepared_data$CLASS <- as.integer(prepared_data$CLASS) - 1
cust.logit <- glm(formula = CLASS~., data = prepared_data)
options(max.print=1000000)
summary(cust.logit)
## 
## Call:
## glm(formula = CLASS ~ ., data = prepared_data)
## 
## Deviance Residuals: 
##      Min        1Q    Median        3Q       Max  
## -0.57613  -0.09123  -0.04034   0.00624   1.04092  
## 
## Coefficients: (21 not defined because of singularities)
##                                                      Estimate Std. Error
## (Intercept)                                        -8.097e-01  6.148e-01
## MOSTYPE.Affluent young families                     9.177e-02  4.748e-02
## MOSTYPE.Career and childcare                        1.141e-01  5.783e-02
## MOSTYPE.Couples with teens 'Married with children'  2.248e-01  9.278e-02
## MOSTYPE.Dinki's (Double income no kids)             4.802e-05  4.196e-02
## MOSTYPE.Ethnically diverse                          1.871e-01  1.108e-01
## MOSTYPE.Family starters                             1.970e-02  3.320e-02
## MOSTYPE.Fresh masters in the city                  -3.548e-03  9.812e-02
## MOSTYPE.High Income, expensive child                1.069e-01  5.788e-02
## MOSTYPE.High status seniors                         4.822e-02  3.154e-02
## MOSTYPE.Large family farms                          1.646e-01  9.752e-02
## MOSTYPE.Large family,employed child                 3.782e-02  3.469e-02
## MOSTYPE.Large religous families                     1.478e-01  8.153e-02
## MOSTYPE.Low income catholics                        1.698e-01  1.166e-01
## MOSTYPE.Lower class large families                  2.236e-01  9.221e-02
## MOSTYPE.Middle class families                       5.937e-02  4.443e-02
## MOSTYPE.Mixed apartment dwellers                    1.584e-01  1.063e-01
## MOSTYPE.Mixed rurals                                1.949e-01  8.910e-02
## MOSTYPE.Mixed seniors                               1.653e-01  9.601e-02
## MOSTYPE.Mixed seniors2                              1.728e-01  1.164e-01
## MOSTYPE.Mixed small town dwellers                   2.285e-01  8.833e-02
## MOSTYPE.Modern, complete families                   1.629e-01  8.851e-02
## MOSTYPE.Own home elderly                            1.710e-01  1.201e-01
## MOSTYPE.Porchless seniors: no front yard            2.231e-01  9.455e-02
## MOSTYPE.Religious elderly singles                   1.804e-01  1.085e-01
## MOSTYPE.Residential elderly                         7.543e-02  1.253e-01
## MOSTYPE.Senior cosmopolitans                        1.397e-01  1.510e-01
## MOSTYPE.Seniors in apartments                       1.343e-01  1.198e-01
## MOSTYPE.Single youth                                1.581e-01  1.126e-01
## MOSTYPE.Stable family                               8.448e-02  5.571e-02
## MOSTYPE.Students in apartments                      1.337e-01  1.291e-01
## MOSTYPE.Suburban youth                              1.610e-01  1.277e-01
## MOSTYPE.Traditional families                        2.092e-01  8.861e-02
## MOSTYPE.Very Important Provincials                  2.877e-02  3.995e-02
## MOSTYPE.Village families                            1.262e-01  8.127e-02
## MOSTYPE.Yound seniros in the city                   1.886e-01  1.168e-01
## MOSTYPE.Young all american family                   2.975e-02  3.366e-02
## MOSTYPE.Young and rising                            1.725e-01  9.287e-02
## MOSTYPE.Young urban have-nots                       1.776e-01  1.217e-01
## MOSTYPE.Young, low educated                         1.765e-01  1.076e-01
## MAANTHUI                                            9.176e-03  7.617e-03
## MGEMOMV                                             2.870e-03  6.750e-03
## MGEMLEEF30-40 years                                 2.549e-02  2.839e-02
## MGEMLEEF40-50 years                                 2.312e-02  2.848e-02
## MGEMLEEF50-60 years                                 2.429e-02  2.946e-02
## MGEMLEEF60-70 years                                 2.400e-02  3.310e-02
## MGEMLEEF70-80 years                                 2.630e-02  5.223e-02
## MOSHOOFDCareer Loners                                      NA         NA
## MOSHOOFDConservatie Families                               NA         NA
## MOSHOOFDCruising Seniors                                   NA         NA
## MOSHOOFDDriven Growers                                     NA         NA
## MOSHOOFDFamily with grown ups                              NA         NA
## MOSHOOFDFarmers                                            NA         NA
## MOSHOOFDLiving well                                        NA         NA
## MOSHOOFDRetired and Religious                              NA         NA
## MOSHOOFDSuccessful hedonists                               NA         NA
## MGODRK1-10%                                         2.154e-02  8.754e-03
## MGODRK100%                                          1.901e-02  9.722e-02
## MGODRK11-23%                                        1.011e-02  1.240e-02
## MGODRK24-36%                                       -2.505e-03  2.291e-02
## MGODRK37-49%                                       -6.070e-03  3.339e-02
## MGODRK50-62%                                        6.011e-03  5.552e-02
## MGODRK63-75%                                        6.394e-02  6.242e-02
## MGODRK76-88%                                        7.327e-02  8.523e-02
## MGODRK89-99%                                        1.481e-02  1.455e-01
## MGODPR1-10%                                        -3.425e-03  3.390e-02
## MGODPR100%                                          6.437e-02  6.245e-02
## MGODPR11-23%                                        4.171e-02  3.211e-02
## MGODPR24-36%                                        3.469e-02  3.327e-02
## MGODPR37-49%                                        3.442e-02  3.628e-02
## MGODPR50-62%                                        5.577e-02  4.088e-02
## MGODPR63-75%                                        5.001e-02  4.454e-02
## MGODPR76-88%                                        8.017e-02  5.079e-02
## MGODPR89-99%                                        8.129e-02  6.038e-02
## MGODOV1-10%                                        -1.038e-02  9.062e-03
## MGODOV11-23%                                        9.396e-03  1.267e-02
## MGODOV24-36%                                        2.665e-02  2.046e-02
## MGODOV37-49%                                        1.292e-02  2.960e-02
## MGODOV50-62%                                        7.086e-02  5.500e-02
## MGODGE1-10%                                         2.804e-02  2.252e-02
## MGODGE100%                                          7.206e-02  9.053e-02
## MGODGE11-23%                                        2.079e-02  1.912e-02
## MGODGE24-36%                                        1.835e-02  2.356e-02
## MGODGE37-49%                                        2.384e-02  2.752e-02
## MGODGE50-62%                                        3.119e-02  3.348e-02
## MGODGE63-75%                                        8.554e-03  4.074e-02
## MGODGE76-88%                                        2.035e-02  4.911e-02
## MGODGE89-99%                                        3.625e-02  8.748e-02
## MRELGE1-10%                                         1.025e-02  5.172e-02
## MRELGE100%                                          7.439e-02  8.353e-02
## MRELGE11-23%                                       -1.948e-02  4.769e-02
## MRELGE24-36%                                       -1.458e-03  5.160e-02
## MRELGE37-49%                                       -2.064e-04  5.530e-02
## MRELGE50-62%                                        1.668e-02  6.198e-02
## MRELGE63-75%                                        2.957e-02  6.763e-02
## MRELGE76-88%                                        3.405e-02  7.227e-02
## MRELGE89-99%                                        6.550e-02  7.804e-02
## MRELSA1-10%                                        -7.343e-03  9.423e-03
## MRELSA11-23%                                        5.783e-03  1.547e-02
## MRELSA24-36%                                        1.409e-02  2.865e-02
## MRELSA37-49%                                        2.817e-02  3.915e-02
## MRELSA50-62%                                        1.499e-02  6.598e-02
## MRELSA63-75%                                        4.515e-03  9.266e-02
## MRELSA76-88%                                        2.562e-02  1.761e-01
## MRELOV1-10%                                         1.988e-02  1.956e-02
## MRELOV100%                                          1.232e-01  1.088e-01
## MRELOV11-23%                                        4.115e-02  2.084e-02
## MRELOV24-36%                                        4.925e-02  2.606e-02
## MRELOV37-49%                                        4.491e-02  3.426e-02
## MRELOV50-62%                                        4.727e-02  4.491e-02
## MRELOV63-75%                                        3.220e-02  5.426e-02
## MRELOV76-88%                                        2.657e-02  6.814e-02
## MRELOV89-99%                                        4.987e-02  9.373e-02
## MFALLEEN1-10%                                      -5.933e-03  1.180e-02
## MFALLEEN100%                                       -5.562e-02  8.061e-02
## MFALLEEN11-23%                                     -1.897e-02  1.439e-02
## MFALLEEN24-36%                                     -1.617e-02  2.025e-02
## MFALLEEN37-49%                                     -2.213e-02  2.763e-02
## MFALLEEN50-62%                                     -7.763e-03  3.531e-02
## MFALLEEN63-75%                                     -2.854e-02  4.356e-02
## MFALLEEN76-88%                                     -5.498e-02  5.667e-02
## MFALLEEN89-99%                                     -6.597e-03  7.265e-02
## MFGEKIND1-10%                                      -4.468e-04  2.190e-02
## MFGEKIND100%                                       -5.337e-02  8.498e-02
## MFGEKIND11-23%                                      1.042e-02  2.208e-02
## MFGEKIND24-36%                                     -9.569e-03  2.537e-02
## MFGEKIND37-49%                                     -1.225e-02  2.976e-02
## MFGEKIND50-62%                                     -3.649e-02  3.562e-02
## MFGEKIND63-75%                                     -6.335e-02  4.247e-02
## MFGEKIND76-88%                                     -3.507e-02  5.211e-02
## MFGEKIND89-99%                                     -4.521e-02  7.460e-02
## MFWEKIND1-10%                                      -1.677e-02  2.770e-02
## MFWEKIND100%                                       -7.261e-02  6.458e-02
## MFWEKIND11-23%                                      2.008e-03  2.921e-02
## MFWEKIND24-36%                                     -1.585e-02  3.252e-02
## MFWEKIND37-49%                                     -2.842e-02  3.711e-02
## MFWEKIND50-62%                                     -4.570e-02  4.084e-02
## MFWEKIND63-75%                                     -5.006e-02  4.673e-02
## MFWEKIND76-88%                                     -4.535e-02  5.257e-02
## MFWEKIND89-99%                                     -5.117e-02  5.982e-02
## MOPLHOOG1-10%                                       2.046e-02  1.037e-02
## MOPLHOOG100%                                       -1.399e-02  9.112e-02
## MOPLHOOG11-23%                                      1.641e-02  1.424e-02
## MOPLHOOG24-36%                                      8.367e-04  2.157e-02
## MOPLHOOG37-49%                                      1.498e-02  2.797e-02
## MOPLHOOG50-62%                                      3.714e-02  3.747e-02
## MOPLHOOG63-75%                                      5.515e-02  5.027e-02
## MOPLHOOG76-88%                                      3.224e-02  6.091e-02
## MOPLHOOG89-99%                                     -7.197e-02  7.458e-02
## MOPLMIDD1-10%                                       2.815e-02  2.882e-02
## MOPLMIDD100%                                       -7.116e-03  7.575e-02
## MOPLMIDD11-23%                                     -3.412e-03  2.695e-02
## MOPLMIDD24-36%                                     -1.146e-02  3.118e-02
## MOPLMIDD37-49%                                     -5.199e-03  3.493e-02
## MOPLMIDD50-62%                                     -8.525e-04  4.012e-02
## MOPLMIDD63-75%                                     -5.203e-03  4.723e-02
## MOPLMIDD76-88%                                      2.000e-02  5.608e-02
## MOPLMIDD89-99%                                      2.566e-03  7.122e-02
## MOPLLAAG1-10%                                      -1.559e-02  2.595e-02
## MOPLLAAG100%                                       -6.046e-02  6.912e-02
## MOPLLAAG11-23%                                     -3.113e-02  2.620e-02
## MOPLLAAG24-36%                                     -3.617e-02  2.977e-02
## MOPLLAAG37-49%                                     -3.547e-02  3.520e-02
## MOPLLAAG50-62%                                     -3.769e-02  4.080e-02
## MOPLLAAG63-75%                                     -4.340e-02  4.721e-02
## MOPLLAAG76-88%                                     -6.211e-02  5.487e-02
## MOPLLAAG89-99%                                     -6.686e-02  6.416e-02
## MBERHOOG1-10%                                       7.911e-04  1.285e-02
## MBERHOOG100%                                        1.061e-01  6.741e-02
## MBERHOOG11-23%                                      1.565e-02  1.362e-02
## MBERHOOG24-36%                                      2.040e-02  1.772e-02
## MBERHOOG37-49%                                      3.059e-02  2.327e-02
## MBERHOOG50-62%                                      1.605e-02  2.851e-02
## MBERHOOG63-75%                                      1.578e-02  3.760e-02
## MBERHOOG76-88%                                      2.617e-02  4.729e-02
## MBERHOOG89-99%                                      1.336e-01  7.178e-02
## MBERZELF1-10%                                       9.855e-03  9.265e-03
## MBERZELF11-23%                                      1.730e-02  1.476e-02
## MBERZELF24-36%                                      3.423e-02  3.615e-02
## MBERZELF37-49%                                      4.551e-02  7.006e-02
## MBERZELF50-62%                                      8.657e-02  5.615e-02
## MBERBOER1-10%                                      -3.603e-03  9.442e-03
## MBERBOER100%                                        3.878e-02  9.236e-02
## MBERBOER11-23%                                     -1.704e-03  1.398e-02
## MBERBOER24-36%                                      1.473e-02  2.435e-02
## MBERBOER37-49%                                      7.798e-03  3.519e-02
## MBERBOER50-62%                                      3.316e-02  4.057e-02
## MBERBOER63-75%                                      3.964e-02  7.094e-02
## MBERBOER76-88%                                      4.266e-02  1.459e-01
## MBERBOER89-99%                                      8.104e-02  9.294e-02
## MBERMIDD1-10%                                      -5.179e-03  1.672e-02
## MBERMIDD100%                                        8.598e-02  5.228e-02
## MBERMIDD11-23%                                      8.619e-03  1.545e-02
## MBERMIDD24-36%                                      2.394e-03  1.882e-02
## MBERMIDD37-49%                                      3.297e-02  2.309e-02
## MBERMIDD50-62%                                      2.548e-02  2.862e-02
## MBERMIDD63-75%                                      6.360e-02  3.388e-02
## MBERMIDD76-88%                                      9.984e-02  4.156e-02
## MBERMIDD89-99%                                     -7.989e-02  6.678e-02
## MBERARBG1-10%                                       1.309e-02  1.165e-02
## MBERARBG100%                                        6.474e-02  6.935e-02
## MBERARBG11-23%                                      2.500e-02  1.309e-02
## MBERARBG24-36%                                      2.726e-02  1.701e-02
## MBERARBG37-49%                                      3.638e-02  2.190e-02
## MBERARBG50-62%                                      5.021e-02  2.752e-02
## MBERARBG63-75%                                      4.066e-02  3.370e-02
## MBERARBG76-88%                                      8.831e-02  4.487e-02
## MBERARBG89-99%                                      1.089e-01  5.942e-02
## MBERARBO1-10%                                       2.292e-03  1.214e-02
## MBERARBO100%                                        7.328e-02  6.582e-02
## MBERARBO11-23%                                      1.048e-02  1.351e-02
## MBERARBO24-36%                                      2.429e-02  1.692e-02
## MBERARBO37-49%                                      3.548e-02  2.164e-02
## MBERARBO50-62%                                      5.368e-02  2.787e-02
## MBERARBO63-75%                                      4.685e-02  3.538e-02
## MBERARBO76-88%                                      7.867e-02  4.812e-02
## MBERARBO89-99%                                      7.103e-02  7.437e-02
## MSKA1-10%                                          -9.055e-03  1.230e-02
## MSKA100%                                           -4.077e-02  6.623e-02
## MSKA11-23%                                         -3.016e-03  1.429e-02
## MSKA24-36%                                          1.413e-02  1.936e-02
## MSKA37-49%                                          2.906e-02  2.683e-02
## MSKA50-62%                                          3.756e-02  3.348e-02
## MSKA63-75%                                          4.158e-02  4.405e-02
## MSKA76-88%                                          4.277e-02  5.206e-02
## MSKA89-99%                                         -7.080e-02  9.011e-02
## MSKB11-10%                                          6.049e-03  1.089e-02
## MSKB1100%                                          -2.212e-02  8.540e-02
## MSKB111-23%                                        -1.238e-02  1.273e-02
## MSKB124-36%                                         2.062e-03  1.749e-02
## MSKB137-49%                                        -4.075e-03  2.488e-02
## MSKB150-62%                                        -7.627e-03  3.679e-02
## MSKB163-75%                                         9.553e-02  4.954e-02
## MSKB176-88%                                        -1.305e-01  9.221e-02
## MSKB189-99%                                         9.409e-02  8.636e-02
## MSKB21-10%                                         -1.528e-02  1.280e-02
## MSKB2100%                                          -2.061e-02  1.539e-01
## MSKB211-23%                                        -7.867e-03  1.388e-02
## MSKB224-36%                                        -1.074e-02  1.776e-02
## MSKB237-49%                                         6.699e-03  2.335e-02
## MSKB250-62%                                        -5.678e-03  2.881e-02
## MSKB263-75%                                         3.634e-02  3.818e-02
## MSKB276-88%                                        -4.010e-02  9.135e-02
## MSKB289-99%                                        -1.338e-01  1.020e-01
## MSKC1-10%                                           2.432e-02  2.331e-02
## MSKC100%                                            2.912e-02  5.382e-02
## MSKC11-23%                                          4.128e-02  2.152e-02
## MSKC24-36%                                          3.834e-02  2.411e-02
## MSKC37-49%                                          3.948e-02  2.780e-02
## MSKC50-62%                                          2.916e-02  3.198e-02
## MSKC63-75%                                          4.373e-02  3.785e-02
## MSKC76-88%                                          5.182e-02  4.356e-02
## MSKC89-99%                                          8.802e-02  5.280e-02
## MSKD1-10%                                           2.490e-03  8.993e-03
## MSKD100%                                            5.845e-02  2.408e-01
## MSKD11-23%                                          3.593e-03  1.212e-02
## MSKD24-36%                                         -1.540e-02  1.761e-02
## MSKD37-49%                                          1.271e-02  2.581e-02
## MSKD50-62%                                         -2.362e-03  3.564e-02
## MSKD63-75%                                          1.482e-02  5.580e-02
## MSKD76-88%                                          6.282e-02  7.284e-02
## MSKD89-99%                                          2.913e-02  2.419e-01
## MHHUUR1-10%                                        -1.551e-02  1.408e-02
## MHHUUR100%                                         -1.110e-02  1.393e-02
## MHHUUR11-23%                                       -6.876e-02  2.326e-01
## MHHUUR24-36%                                       -1.213e-01  2.481e-01
## MHHUUR37-49%                                       -1.222e-01  2.598e-01
## MHHUUR50-62%                                       -1.045e-01  2.687e-01
## MHHUUR63-75%                                       -8.130e-02  3.566e-01
## MHHUUR76-88%                                       -1.932e-01  4.264e-01
## MHHUUR89-99%                                       -2.224e-01  4.495e-01
## MHKOOP1-10%                                         2.201e-01  4.494e-01
## MHKOOP100%                                                 NA         NA
## MHKOOP11-23%                                        1.650e-01  4.264e-01
## MHKOOP24-36%                                        7.299e-02  3.567e-01
## MHKOOP37-49%                                        7.625e-02  2.688e-01
## MHKOOP50-62%                                        9.805e-02  2.597e-01
## MHKOOP63-75%                                        9.361e-02  2.481e-01
## MHKOOP76-88%                                        3.115e-02  2.325e-01
## MHKOOP89-99%                                               NA         NA
## MAUT1.1-10%                                         2.765e-03  2.417e-01
## MAUT1.100%                                          8.453e-02  1.823e-01
## MAUT1.11-23%                                       -8.342e-03  1.750e-01
## MAUT1.24-36%                                        3.111e-02  1.735e-01
## MAUT1.37-49%                                        5.391e-02  1.729e-01
## MAUT1.50-62%                                        8.230e-02  1.746e-01
## MAUT1.63-75%                                        8.029e-02  1.767e-01
## MAUT1.76-88%                                        8.452e-02  1.784e-01
## MAUT1.89-99%                                        8.894e-02  1.807e-01
## MAUT2.1-10%                                         1.161e-02  1.026e-02
## MAUT2.100%                                         -3.823e-02  3.040e-01
## MAUT2.11-23%                                        1.780e-02  1.441e-02
## MAUT2.24-36%                                        6.489e-03  2.271e-02
## MAUT2.37-49%                                        3.114e-02  3.219e-02
## MAUT2.50-62%                                        1.465e-02  4.754e-02
## MAUT2.63-75%                                        5.368e-02  8.424e-02
## MAUT2.76-88%                                        6.155e-02  1.933e-01
## MAUT01-10%                                         -6.401e-03  1.384e-02
## MAUT0100%                                           3.978e-02  1.956e-01
## MAUT011-23%                                         2.009e-02  1.588e-02
## MAUT024-36%                                         2.392e-02  2.276e-02
## MAUT037-49%                                         2.801e-03  3.065e-02
## MAUT050-62%                                         4.028e-02  4.182e-02
## MAUT063-75%                                         8.317e-02  5.264e-02
## MAUT076-88%                                         7.742e-02  7.776e-02
## MAUT089-99%                                         1.347e-01  1.976e-01
## MZFONDS.1-10%                                      -8.008e-02  6.308e-02
## MZFONDS.100%                                       -2.441e-02  3.308e-02
## MZFONDS.11-23%                                     -5.214e-02  3.225e-02
## MZFONDS.24-36%                                     -2.539e-02  3.366e-02
## MZFONDS.37-49%                                      9.144e-02  2.564e-01
## MZFONDS.50-62%                                     -1.105e-03  2.786e-01
## MZFONDS.63-75%                                     -4.252e-02  2.870e-01
## MZFONDS.76-88%                                     -2.273e-02  3.224e-02
## MZFONDS.89-99%                                     -2.456e-02  3.348e-02
## MZPART1-10%                                                NA         NA
## MZPART100%                                                 NA         NA
## MZPART11-23%                                               NA         NA
## MZPART24-36%                                       -1.714e-02  2.855e-01
## MZPART37-49%                                       -2.586e-02  2.771e-01
## MZPART50-62%                                       -1.304e-01  2.550e-01
## MZPART63-75%                                               NA         NA
## MZPART76-88%                                               NA         NA
## MZPART89-99%                                               NA         NA
## MINKM30.1-10%                                       9.086e-03  1.254e-02
## MINKM30.100%                                       -9.306e-02  6.673e-02
## MINKM30.11-23%                                      2.199e-02  1.426e-02
## MINKM30.24-36%                                     -1.025e-02  1.997e-02
## MINKM30.37-49%                                     -2.923e-02  2.613e-02
## MINKM30.50-62%                                     -3.168e-02  3.228e-02
## MINKM30.63-75%                                     -2.366e-02  3.862e-02
## MINKM30.76-88%                                     -2.908e-02  4.724e-02
## MINKM30.89-99%                                     -7.773e-02  5.982e-02
## MINK30451-10%                                      -2.220e-02  2.035e-02
## MINK3045100%                                       -5.658e-03  5.780e-02
## MINK304511-23%                                     -2.311e-02  1.856e-02
## MINK304524-36%                                     -3.246e-02  2.169e-02
## MINK304537-49%                                     -4.032e-02  2.567e-02
## MINK304550-62%                                     -6.267e-02  3.158e-02
## MINK304563-75%                                     -7.284e-02  3.791e-02
## MINK304576-88%                                     -7.188e-02  4.420e-02
## MINK304589-99%                                     -9.922e-02  6.043e-02
## MINK45751-10%                                       5.497e-03  1.349e-02
## MINK4575100%                                       -6.824e-02  5.736e-02
## MINK457511-23%                                     -8.905e-03  1.442e-02
## MINK457524-36%                                     -9.462e-03  1.847e-02
## MINK457537-49%                                     -1.422e-02  2.331e-02
## MINK457550-62%                                     -3.474e-02  2.997e-02
## MINK457563-75%                                     -5.458e-02  3.735e-02
## MINK457576-88%                                     -8.472e-02  4.585e-02
## MINK457589-99%                                     -9.064e-02  5.473e-02
## MINK75121-10%                                       3.387e-03  8.714e-03
## MINK7512100%                                        5.027e-02  1.215e-01
## MINK751211-23%                                     -1.378e-02  1.345e-02
## MINK751224-36%                                     -3.751e-02  2.250e-02
## MINK751237-49%                                     -2.743e-02  3.214e-02
## MINK751250-62%                                     -6.366e-02  4.398e-02
## MINK751263-75%                                     -1.493e-01  9.274e-02
## MINK751276-88%                                     -9.404e-02  1.912e-01
## MINK751289-99%                                     -2.251e-01  1.252e-01
## MINK123M1-10%                                      -2.428e-02  1.069e-02
## MINK123M100%                                       -2.189e-02  2.700e-01
## MINK123M11-23%                                     -4.618e-02  2.787e-02
## MINK123M24-36%                                     -8.485e-02  5.039e-02
## MINK123M37-49%                                     -1.160e-01  6.881e-02
## MINK123M50-62%                                     -1.088e-01  1.358e-01
## MINK123M63-75%                                     -1.427e-01  1.890e-01
## MINK123M76-88%                                     -2.650e-01  2.571e-01
## MINKGEM1-10%                                        7.751e-02  9.526e-02
## MINKGEM100%                                         5.989e-02  6.668e-02
## MINKGEM11-23%                                       1.341e-02  8.412e-02
## MINKGEM24-36%                                       8.980e-03  8.113e-02
## MINKGEM37-49%                                       2.924e-02  7.863e-02
## MINKGEM50-62%                                       3.098e-02  7.588e-02
## MINKGEM63-75%                                       2.802e-02  7.306e-02
## MINKGEM76-88%                                       7.588e-02  7.178e-02
## MINKGEM89-99%                                       4.246e-02  6.664e-02
## MKOOPKLA11-23%                                     -1.424e-02  4.140e-02
## MKOOPKLA24-36%                                     -3.817e-02  7.116e-02
## MKOOPKLA37-49%                                     -2.413e-02  7.391e-02
## MKOOPKLA50-62%                                      3.339e-02  8.258e-02
## MKOOPKLA63-75%                                      1.350e-01  1.157e-01
## MKOOPKLA76-88%                                      1.465e-01  1.208e-01
## MKOOPKLA89-99%                                      9.216e-02  1.256e-01
## PWAPART1-49                                         2.285e-01  9.033e-02
## PWAPART100-199                                      4.698e-01  1.425e-01
## PWAPART50-99                                        2.504e-01  8.959e-02
## PWABEDR1-49                                         2.331e-02  1.035e-01
## PWABEDR100-199                                      8.024e-02  7.979e-02
## PWABEDR1000-4999                                   -2.001e-02  1.866e-01
## PWABEDR200-499                                     -3.279e-02  8.677e-02
## PWABEDR50-99                                        2.396e-02  7.619e-02
## PWABEDR500-999                                      3.789e-02  1.379e-01
## PWALAND1-49                                         1.008e-03  1.647e-01
## PWALAND100-199                                     -4.493e-02  2.773e-02
## PWALAND200-499                                     -5.563e-02  2.669e-02
## PWALAND50-99                                       -6.268e-02  8.956e-02
## PPERSAUT0                                           3.098e-01  2.679e-01
## PPERSAUT10,000-19,999                               6.939e-02  2.755e-01
## PPERSAUT1000-4999                                   3.436e-01  2.618e-01
## PPERSAUT200-499                                     2.426e-01  2.864e-01
## PPERSAUT500-999                                     2.813e-01  2.623e-01
## PPERSAUT5000-9999                                   2.493e-01  2.592e-01
## PBESAUT1000-4999                                    4.970e-02  9.161e-02
## PBESAUT500-999                                      1.680e-02  9.900e-02
## PBESAUT5000-9999                                    1.313e-01  3.304e-01
## PMOTSCO100-199                                      3.941e-01  1.222e-01
## PMOTSCO1000-4999                                   -5.896e-02  4.924e-02
## PMOTSCO200-499                                     -2.006e-02  3.969e-02
## PMOTSCO500-999                                     -1.447e-02  5.079e-02
## PMOTSCO5000-9999                                   -1.228e-01  2.440e-01
## PVRAAUT0                                            2.381e-02  3.746e-01
## PVRAAUT1000-4999                                   -3.093e-02  2.965e-01
## PVRAAUT200-499                                     -1.743e-02  3.870e-01
## PVRAAUT5000-9999                                   -4.844e-02  3.004e-01
## PAANHANG1-49                                        4.514e-02  1.746e-01
## PAANHANG100-199                                     3.853e-02  2.643e-01
## PAANHANG200-499                                     7.260e-02  5.583e-01
## PAANHANG50-99                                       8.093e-02  1.715e-01
## PAANHANG500-999                                    -4.178e-02  5.649e-01
## PTRACTOR100-199                                     1.339e-02  3.317e-02
## PTRACTOR1000-4999                                   1.135e-01  8.866e-02
## PTRACTOR200-499                                     1.052e-02  5.333e-02
## PTRACTOR500-999                                     6.977e-02  6.388e-02
## PTRACTOR5000-9999                                   9.889e-02  3.119e-01
## PWERKT1-49                                         -3.861e-03  2.494e-01
## PWERKT100-199                                      -9.462e-02  1.223e-01
## PWERKT1000-4999                                    -6.927e-02  2.689e-01
## PWERKT200-499                                      -7.346e-02  1.405e-01
## PWERKT50-99                                        -4.625e-02  1.239e-01
## PBROM100-199                                        6.326e-02  4.926e-02
## PBROM1000-4999                                      3.371e-02  1.713e-01
## PBROM200-499                                        6.950e-02  5.942e-02
## PBROM50-99                                          7.279e-02  5.579e-02
## PBROM500-999                                        7.594e-02  6.865e-02
## PLEVEN0                                             1.143e-01  3.018e-01
## PLEVEN1-49                                          2.906e-02  3.033e-01
## PLEVEN10,000-19,999                                 1.138e-01  3.949e-01
## PLEVEN100-199                                       7.795e-02  2.947e-01
## PLEVEN1000-4999                                     7.958e-04  2.955e-01
## PLEVEN200-499                                       6.199e-02  2.940e-01
## PLEVEN50-99                                         5.517e-02  2.972e-01
## PLEVEN500-999                                       6.305e-02  2.959e-01
## PLEVEN5000-9999                                    -9.406e-02  3.160e-01
## PPERSONG1-49                                        7.271e-02  9.475e-02
## PPERSONG100-199                                    -5.244e-02  8.831e-02
## PPERSONG1000-4999                                  -3.274e-02  1.827e-01
## PPERSONG200-499                                     1.946e-02  1.176e-01
## PPERSONG50-99                                      -4.603e-02  4.942e-02
## PPERSONG500-999                                    -2.693e-03  1.646e-01
## PGEZONG100-199                                      1.749e-01  4.432e-02
## PGEZONG50-99                                       -1.535e-02  3.397e-02
## PWAOREG1000-4999                                    1.799e-01  1.525e-01
## PWAOREG200-499                                     -1.432e-01  2.674e-01
## PWAOREG500-999                                      1.330e-01  2.660e-01
## PWAOREG5000-9999                                    1.330e-01  1.823e-01
## PBRAND1-49                                         -2.770e-03  1.997e-02
## PBRAND10,000-19,999                                 1.973e-02  1.741e-01
## PBRAND100-199                                       3.080e-02  1.467e-02
## PBRAND1000-4999                                     2.133e-02  2.388e-02
## PBRAND200-499                                       5.740e-02  1.545e-02
## PBRAND50-99                                        -4.687e-03  1.575e-02
## PBRAND500-999                                       4.340e-02  2.159e-02
## PBRAND5000-9999                                    -1.943e-02  7.604e-02
## PZEILPL1-49                                         5.060e-02  1.439e-01
## PZEILPL100-199                                     -8.701e-02  2.368e-01
## PZEILPL50-99                                        3.670e-01  1.085e-01
## PPLEZIER1-49                                        4.033e-01  1.517e-01
## PPLEZIER100-199                                     3.107e-01  1.516e-01
## PPLEZIER1000-4999                                   5.612e-01  1.906e-01
## PPLEZIER200-499                                     1.558e-01  1.548e-01
## PPLEZIER50-99                                       6.475e-02  1.563e-01
## PPLEZIER500-999                                    -1.393e-01  1.654e-01
## PFIETS1-49                                         -4.423e-03  4.195e-02
## PINBOED1-49                                         1.032e-01  2.589e-01
## PINBOED100-199                                     -4.575e-02  2.705e-01
## PINBOED1000-4999                                   -2.306e-02  3.041e-01
## PINBOED200-499                                     -3.807e-02  3.050e-01
## PINBOED50-99                                        7.115e-02  2.594e-01
## PINBOED500-999                                      7.207e-03  3.481e-01
## PBYSTAND100-199                                     1.283e-01  2.467e-01
## PBYSTAND200-499                                     1.963e-01  2.375e-01
## PBYSTAND50-99                                       1.195e-01  2.397e-01
## PBYSTAND500-999                                     6.331e-02  2.920e-01
## AWAPART                                            -2.344e-01  8.926e-02
## AWABEDR                                            -1.299e-02  6.850e-02
## AWALAND                                                    NA         NA
## APERSAUT                                            2.565e-02  1.136e-02
## ABESAUT                                            -5.844e-02  8.124e-02
## AMOTSCO                                             1.661e-02  3.588e-02
## AVRAAUT                                             9.262e-03  9.845e-02
## AAANHANG                                           -3.544e-02  1.688e-01
## ATRACTOR                                           -2.151e-02  2.516e-02
## AWERKT                                              2.677e-02  7.963e-02
## ABROM                                              -6.291e-02  4.708e-02
## ALEVEN                                              3.843e-02  1.445e-02
## APERSONG                                                   NA         NA
## AGEZONG                                                    NA         NA
## AWAOREG                                            -7.990e-02  1.281e-01
## ABRAND                                             -1.203e-02  1.229e-02
## AZEILPL                                                    NA         NA
## APLEZIER                                            7.020e-02  1.182e-01
## AFIETS                                              2.063e-02  3.143e-02
## AINBOED                                            -7.776e-02  2.556e-01
## ABYSTAND                                           -1.111e-01  2.358e-01
##                                                    t value Pr(>|t|)    
## (Intercept)                                         -1.317 0.187848    
## MOSTYPE.Affluent young families                      1.933 0.053291 .  
## MOSTYPE.Career and childcare                         1.974 0.048465 *  
## MOSTYPE.Couples with teens 'Married with children'   2.423 0.015413 *  
## MOSTYPE.Dinki's (Double income no kids)              0.001 0.999087    
## MOSTYPE.Ethnically diverse                           1.689 0.091232 .  
## MOSTYPE.Family starters                              0.593 0.552864    
## MOSTYPE.Fresh masters in the city                   -0.036 0.971152    
## MOSTYPE.High Income, expensive child                 1.847 0.064746 .  
## MOSTYPE.High status seniors                          1.529 0.126318    
## MOSTYPE.Large family farms                           1.688 0.091511 .  
## MOSTYPE.Large family,employed child                  1.090 0.275620    
## MOSTYPE.Large religous families                      1.812 0.069951 .  
## MOSTYPE.Low income catholics                         1.456 0.145434    
## MOSTYPE.Lower class large families                   2.425 0.015338 *  
## MOSTYPE.Middle class families                        1.336 0.181538    
## MOSTYPE.Mixed apartment dwellers                     1.490 0.136324    
## MOSTYPE.Mixed rurals                                 2.187 0.028743 *  
## MOSTYPE.Mixed seniors                                1.722 0.085120 .  
## MOSTYPE.Mixed seniors2                               1.485 0.137627    
## MOSTYPE.Mixed small town dwellers                    2.587 0.009699 ** 
## MOSTYPE.Modern, complete families                    1.841 0.065697 .  
## MOSTYPE.Own home elderly                             1.423 0.154689    
## MOSTYPE.Porchless seniors: no front yard             2.360 0.018294 *  
## MOSTYPE.Religious elderly singles                    1.663 0.096440 .  
## MOSTYPE.Residential elderly                          0.602 0.547229    
## MOSTYPE.Senior cosmopolitans                         0.926 0.354719    
## MOSTYPE.Seniors in apartments                        1.121 0.262337    
## MOSTYPE.Single youth                                 1.404 0.160340    
## MOSTYPE.Stable family                                1.516 0.129439    
## MOSTYPE.Students in apartments                       1.036 0.300392    
## MOSTYPE.Suburban youth                               1.261 0.207435    
## MOSTYPE.Traditional families                         2.361 0.018223 *  
## MOSTYPE.Very Important Provincials                   0.720 0.471455    
## MOSTYPE.Village families                             1.552 0.120613    
## MOSTYPE.Yound seniros in the city                    1.615 0.106404    
## MOSTYPE.Young all american family                    0.884 0.376887    
## MOSTYPE.Young and rising                             1.858 0.063255 .  
## MOSTYPE.Young urban have-nots                        1.459 0.144628    
## MOSTYPE.Young, low educated                          1.641 0.100877    
## MAANTHUI                                             1.205 0.228339    
## MGEMOMV                                              0.425 0.670694    
## MGEMLEEF30-40 years                                  0.898 0.369303    
## MGEMLEEF40-50 years                                  0.812 0.416958    
## MGEMLEEF50-60 years                                  0.825 0.409612    
## MGEMLEEF60-70 years                                  0.725 0.468311    
## MGEMLEEF70-80 years                                  0.504 0.614622    
## MOSHOOFDCareer Loners                                   NA       NA    
## MOSHOOFDConservatie Families                            NA       NA    
## MOSHOOFDCruising Seniors                                NA       NA    
## MOSHOOFDDriven Growers                                  NA       NA    
## MOSHOOFDFamily with grown ups                           NA       NA    
## MOSHOOFDFarmers                                         NA       NA    
## MOSHOOFDLiving well                                     NA       NA    
## MOSHOOFDRetired and Religious                           NA       NA    
## MOSHOOFDSuccessful hedonists                            NA       NA    
## MGODRK1-10%                                          2.460 0.013910 *  
## MGODRK100%                                           0.196 0.844954    
## MGODRK11-23%                                         0.815 0.415081    
## MGODRK24-36%                                        -0.109 0.912921    
## MGODRK37-49%                                        -0.182 0.855773    
## MGODRK50-62%                                         0.108 0.913776    
## MGODRK63-75%                                         1.024 0.305658    
## MGODRK76-88%                                         0.860 0.390024    
## MGODRK89-99%                                         0.102 0.918925    
## MGODPR1-10%                                         -0.101 0.919520    
## MGODPR100%                                           1.031 0.302706    
## MGODPR11-23%                                         1.299 0.194021    
## MGODPR24-36%                                         1.043 0.297040    
## MGODPR37-49%                                         0.949 0.342804    
## MGODPR50-62%                                         1.364 0.172521    
## MGODPR63-75%                                         1.123 0.261485    
## MGODPR76-88%                                         1.578 0.114501    
## MGODPR89-99%                                         1.346 0.178245    
## MGODOV1-10%                                         -1.146 0.251825    
## MGODOV11-23%                                         0.742 0.458366    
## MGODOV24-36%                                         1.303 0.192763    
## MGODOV37-49%                                         0.436 0.662494    
## MGODOV50-62%                                         1.288 0.197618    
## MGODGE1-10%                                          1.245 0.213065    
## MGODGE100%                                           0.796 0.426041    
## MGODGE11-23%                                         1.087 0.276851    
## MGODGE24-36%                                         0.779 0.436098    
## MGODGE37-49%                                         0.866 0.386488    
## MGODGE50-62%                                         0.932 0.351601    
## MGODGE63-75%                                         0.210 0.833711    
## MGODGE76-88%                                         0.414 0.678556    
## MGODGE89-99%                                         0.414 0.678576    
## MRELGE1-10%                                          0.198 0.842871    
## MRELGE100%                                           0.891 0.373156    
## MRELGE11-23%                                        -0.409 0.682855    
## MRELGE24-36%                                        -0.028 0.977461    
## MRELGE37-49%                                        -0.004 0.997023    
## MRELGE50-62%                                         0.269 0.787885    
## MRELGE63-75%                                         0.437 0.662001    
## MRELGE76-88%                                         0.471 0.637563    
## MRELGE89-99%                                         0.839 0.401273    
## MRELSA1-10%                                         -0.779 0.435844    
## MRELSA11-23%                                         0.374 0.708556    
## MRELSA24-36%                                         0.492 0.622952    
## MRELSA37-49%                                         0.720 0.471790    
## MRELSA50-62%                                         0.227 0.820293    
## MRELSA63-75%                                         0.049 0.961133    
## MRELSA76-88%                                         0.145 0.884322    
## MRELOV1-10%                                          1.016 0.309516    
## MRELOV100%                                           1.132 0.257481    
## MRELOV11-23%                                         1.974 0.048362 *  
## MRELOV24-36%                                         1.890 0.058852 .  
## MRELOV37-49%                                         1.311 0.189854    
## MRELOV50-62%                                         1.053 0.292565    
## MRELOV63-75%                                         0.593 0.552938    
## MRELOV76-88%                                         0.390 0.696619    
## MRELOV89-99%                                         0.532 0.594694    
## MFALLEEN1-10%                                       -0.503 0.615253    
## MFALLEEN100%                                        -0.690 0.490195    
## MFALLEEN11-23%                                      -1.318 0.187493    
## MFALLEEN24-36%                                      -0.798 0.424673    
## MFALLEEN37-49%                                      -0.801 0.423187    
## MFALLEEN50-62%                                      -0.220 0.825961    
## MFALLEEN63-75%                                      -0.655 0.512331    
## MFALLEEN76-88%                                      -0.970 0.331944    
## MFALLEEN89-99%                                      -0.091 0.927648    
## MFGEKIND1-10%                                       -0.020 0.983721    
## MFGEKIND100%                                        -0.628 0.530014    
## MFGEKIND11-23%                                       0.472 0.636969    
## MFGEKIND24-36%                                      -0.377 0.706051    
## MFGEKIND37-49%                                      -0.412 0.680654    
## MFGEKIND50-62%                                      -1.024 0.305657    
## MFGEKIND63-75%                                      -1.492 0.135786    
## MFGEKIND76-88%                                      -0.673 0.500893    
## MFGEKIND89-99%                                      -0.606 0.544544    
## MFWEKIND1-10%                                       -0.606 0.544784    
## MFWEKIND100%                                        -1.124 0.260898    
## MFWEKIND11-23%                                       0.069 0.945183    
## MFWEKIND24-36%                                      -0.487 0.625955    
## MFWEKIND37-49%                                      -0.766 0.443734    
## MFWEKIND50-62%                                      -1.119 0.263085    
## MFWEKIND63-75%                                      -1.071 0.284137    
## MFWEKIND76-88%                                      -0.863 0.388425    
## MFWEKIND89-99%                                      -0.855 0.392360    
## MOPLHOOG1-10%                                        1.973 0.048530 *  
## MOPLHOOG100%                                        -0.154 0.877983    
## MOPLHOOG11-23%                                       1.152 0.249168    
## MOPLHOOG24-36%                                       0.039 0.969060    
## MOPLHOOG37-49%                                       0.536 0.592281    
## MOPLHOOG50-62%                                       0.991 0.321619    
## MOPLHOOG63-75%                                       1.097 0.272649    
## MOPLHOOG76-88%                                       0.529 0.596556    
## MOPLHOOG89-99%                                      -0.965 0.334553    
## MOPLMIDD1-10%                                        0.977 0.328764    
## MOPLMIDD100%                                        -0.094 0.925164    
## MOPLMIDD11-23%                                      -0.127 0.899281    
## MOPLMIDD24-36%                                      -0.367 0.713273    
## MOPLMIDD37-49%                                      -0.149 0.881681    
## MOPLMIDD50-62%                                      -0.021 0.983049    
## MOPLMIDD63-75%                                      -0.110 0.912295    
## MOPLMIDD76-88%                                       0.357 0.721323    
## MOPLMIDD89-99%                                       0.036 0.971263    
## MOPLLAAG1-10%                                       -0.601 0.548023    
## MOPLLAAG100%                                        -0.875 0.381741    
## MOPLLAAG11-23%                                      -1.188 0.234904    
## MOPLLAAG24-36%                                      -1.215 0.224424    
## MOPLLAAG37-49%                                      -1.008 0.313626    
## MOPLLAAG50-62%                                      -0.924 0.355583    
## MOPLLAAG63-75%                                      -0.919 0.357935    
## MOPLLAAG76-88%                                      -1.132 0.257670    
## MOPLLAAG89-99%                                      -1.042 0.297404    
## MBERHOOG1-10%                                        0.062 0.950906    
## MBERHOOG100%                                         1.574 0.115474    
## MBERHOOG11-23%                                       1.149 0.250673    
## MBERHOOG24-36%                                       1.151 0.249580    
## MBERHOOG37-49%                                       1.315 0.188563    
## MBERHOOG50-62%                                       0.563 0.573477    
## MBERHOOG63-75%                                       0.420 0.674717    
## MBERHOOG76-88%                                       0.553 0.580000    
## MBERHOOG89-99%                                       1.862 0.062697 .  
## MBERZELF1-10%                                        1.064 0.287466    
## MBERZELF11-23%                                       1.172 0.241121    
## MBERZELF24-36%                                       0.947 0.343711    
## MBERZELF37-49%                                       0.649 0.516032    
## MBERZELF50-62%                                       1.542 0.123170    
## MBERBOER1-10%                                       -0.382 0.702754    
## MBERBOER100%                                         0.420 0.674595    
## MBERBOER11-23%                                      -0.122 0.902992    
## MBERBOER24-36%                                       0.605 0.545151    
## MBERBOER37-49%                                       0.222 0.824659    
## MBERBOER50-62%                                       0.817 0.413780    
## MBERBOER63-75%                                       0.559 0.576333    
## MBERBOER76-88%                                       0.292 0.769937    
## MBERBOER89-99%                                       0.872 0.383260    
## MBERMIDD1-10%                                       -0.310 0.756757    
## MBERMIDD100%                                         1.644 0.100115    
## MBERMIDD11-23%                                       0.558 0.576931    
## MBERMIDD24-36%                                       0.127 0.898791    
## MBERMIDD37-49%                                       1.428 0.153321    
## MBERMIDD50-62%                                       0.890 0.373415    
## MBERMIDD63-75%                                       1.877 0.060483 .  
## MBERMIDD76-88%                                       2.402 0.016322 *  
## MBERMIDD89-99%                                      -1.196 0.231585    
## MBERARBG1-10%                                        1.124 0.261153    
## MBERARBG100%                                         0.934 0.350567    
## MBERARBG11-23%                                       1.910 0.056159 .  
## MBERARBG24-36%                                       1.602 0.109115    
## MBERARBG37-49%                                       1.661 0.096702 .  
## MBERARBG50-62%                                       1.825 0.068088 .  
## MBERARBG63-75%                                       1.206 0.227673    
## MBERARBG76-88%                                       1.968 0.049087 *  
## MBERARBG89-99%                                       1.832 0.066948 .  
## MBERARBO1-10%                                        0.189 0.850302    
## MBERARBO100%                                         1.113 0.265525    
## MBERARBO11-23%                                       0.776 0.438040    
## MBERARBO24-36%                                       1.436 0.151043    
## MBERARBO37-49%                                       1.639 0.101156    
## MBERARBO50-62%                                       1.926 0.054083 .  
## MBERARBO63-75%                                       1.324 0.185467    
## MBERARBO76-88%                                       1.635 0.102096    
## MBERARBO89-99%                                       0.955 0.339542    
## MSKA1-10%                                           -0.736 0.461719    
## MSKA100%                                            -0.616 0.538148    
## MSKA11-23%                                          -0.211 0.832860    
## MSKA24-36%                                           0.729 0.465739    
## MSKA37-49%                                           1.083 0.278660    
## MSKA50-62%                                           1.122 0.261972    
## MSKA63-75%                                           0.944 0.345241    
## MSKA76-88%                                           0.822 0.411320    
## MSKA89-99%                                          -0.786 0.432040    
## MSKB11-10%                                           0.555 0.578570    
## MSKB1100%                                           -0.259 0.795596    
## MSKB111-23%                                         -0.973 0.330716    
## MSKB124-36%                                          0.118 0.906184    
## MSKB137-49%                                         -0.164 0.869916    
## MSKB150-62%                                         -0.207 0.835762    
## MSKB163-75%                                          1.928 0.053847 .  
## MSKB176-88%                                         -1.416 0.156905    
## MSKB189-99%                                          1.089 0.275973    
## MSKB21-10%                                          -1.194 0.232658    
## MSKB2100%                                           -0.134 0.893433    
## MSKB211-23%                                         -0.567 0.570756    
## MSKB224-36%                                         -0.605 0.545154    
## MSKB237-49%                                          0.287 0.774151    
## MSKB250-62%                                         -0.197 0.843760    
## MSKB263-75%                                          0.952 0.341112    
## MSKB276-88%                                         -0.439 0.660702    
## MSKB289-99%                                         -1.312 0.189643    
## MSKC1-10%                                            1.043 0.296862    
## MSKC100%                                             0.541 0.588494    
## MSKC11-23%                                           1.918 0.055101 .  
## MSKC24-36%                                           1.590 0.111843    
## MSKC37-49%                                           1.420 0.155656    
## MSKC50-62%                                           0.912 0.361998    
## MSKC63-75%                                           1.155 0.247972    
## MSKC76-88%                                           1.190 0.234247    
## MSKC89-99%                                           1.667 0.095527 .  
## MSKD1-10%                                            0.277 0.781903    
## MSKD100%                                             0.243 0.808253    
## MSKD11-23%                                           0.296 0.767004    
## MSKD24-36%                                          -0.874 0.382036    
## MSKD37-49%                                           0.492 0.622382    
## MSKD50-62%                                          -0.066 0.947165    
## MSKD63-75%                                           0.266 0.790520    
## MSKD76-88%                                           0.862 0.388495    
## MSKD89-99%                                           0.120 0.904172    
## MHHUUR1-10%                                         -1.102 0.270630    
## MHHUUR100%                                          -0.797 0.425456    
## MHHUUR11-23%                                        -0.296 0.767515    
## MHHUUR24-36%                                        -0.489 0.624937    
## MHHUUR37-49%                                        -0.470 0.638064    
## MHHUUR50-62%                                        -0.389 0.697447    
## MHHUUR63-75%                                        -0.228 0.819638    
## MHHUUR76-88%                                        -0.453 0.650400    
## MHHUUR89-99%                                        -0.495 0.620771    
## MHKOOP1-10%                                          0.490 0.624337    
## MHKOOP100%                                              NA       NA    
## MHKOOP11-23%                                         0.387 0.698851    
## MHKOOP24-36%                                         0.205 0.837859    
## MHKOOP37-49%                                         0.284 0.776689    
## MHKOOP50-62%                                         0.378 0.705764    
## MHKOOP63-75%                                         0.377 0.705926    
## MHKOOP76-88%                                         0.134 0.893425    
## MHKOOP89-99%                                            NA       NA    
## MAUT1.1-10%                                          0.011 0.990873    
## MAUT1.100%                                           0.464 0.642983    
## MAUT1.11-23%                                        -0.048 0.961985    
## MAUT1.24-36%                                         0.179 0.857707    
## MAUT1.37-49%                                         0.312 0.755196    
## MAUT1.50-62%                                         0.471 0.637461    
## MAUT1.63-75%                                         0.454 0.649573    
## MAUT1.76-88%                                         0.474 0.635692    
## MAUT1.89-99%                                         0.492 0.622682    
## MAUT2.1-10%                                          1.131 0.258001    
## MAUT2.100%                                          -0.126 0.899905    
## MAUT2.11-23%                                         1.235 0.216756    
## MAUT2.24-36%                                         0.286 0.775126    
## MAUT2.37-49%                                         0.967 0.333451    
## MAUT2.50-62%                                         0.308 0.757866    
## MAUT2.63-75%                                         0.637 0.523958    
## MAUT2.76-88%                                         0.318 0.750222    
## MAUT01-10%                                          -0.462 0.643846    
## MAUT0100%                                            0.203 0.838880    
## MAUT011-23%                                          1.265 0.205959    
## MAUT024-36%                                          1.051 0.293332    
## MAUT037-49%                                          0.091 0.927182    
## MAUT050-62%                                          0.963 0.335449    
## MAUT063-75%                                          1.580 0.114143    
## MAUT076-88%                                          0.996 0.319443    
## MAUT089-99%                                          0.682 0.495371    
## MZFONDS.1-10%                                       -1.270 0.204286    
## MZFONDS.100%                                        -0.738 0.460642    
## MZFONDS.11-23%                                      -1.617 0.105925    
## MZFONDS.24-36%                                      -0.754 0.450583    
## MZFONDS.37-49%                                       0.357 0.721384    
## MZFONDS.50-62%                                      -0.004 0.996837    
## MZFONDS.63-75%                                      -0.148 0.882198    
## MZFONDS.76-88%                                      -0.705 0.480746    
## MZFONDS.89-99%                                      -0.734 0.463146    
## MZPART1-10%                                             NA       NA    
## MZPART100%                                              NA       NA    
## MZPART11-23%                                            NA       NA    
## MZPART24-36%                                        -0.060 0.952115    
## MZPART37-49%                                        -0.093 0.925634    
## MZPART50-62%                                        -0.511 0.609172    
## MZPART63-75%                                            NA       NA    
## MZPART76-88%                                            NA       NA    
## MZPART89-99%                                            NA       NA    
## MINKM30.1-10%                                        0.724 0.468823    
## MINKM30.100%                                        -1.395 0.163153    
## MINKM30.11-23%                                       1.542 0.123127    
## MINKM30.24-36%                                      -0.513 0.607772    
## MINKM30.37-49%                                      -1.118 0.263429    
## MINKM30.50-62%                                      -0.981 0.326421    
## MINKM30.63-75%                                      -0.613 0.540175    
## MINKM30.76-88%                                      -0.616 0.538147    
## MINKM30.89-99%                                      -1.299 0.193851    
## MINK30451-10%                                       -1.091 0.275226    
## MINK3045100%                                        -0.098 0.922020    
## MINK304511-23%                                      -1.245 0.213106    
## MINK304524-36%                                      -1.496 0.134565    
## MINK304537-49%                                      -1.570 0.116337    
## MINK304550-62%                                      -1.985 0.047201 *  
## MINK304563-75%                                      -1.922 0.054696 .  
## MINK304576-88%                                      -1.626 0.103952    
## MINK304589-99%                                      -1.642 0.100657    
## MINK45751-10%                                        0.407 0.683776    
## MINK4575100%                                        -1.190 0.234158    
## MINK457511-23%                                      -0.618 0.536757    
## MINK457524-36%                                      -0.512 0.608444    
## MINK457537-49%                                      -0.610 0.541990    
## MINK457550-62%                                      -1.159 0.246349    
## MINK457563-75%                                      -1.461 0.143958    
## MINK457576-88%                                      -1.848 0.064699 .  
## MINK457589-99%                                      -1.656 0.097728 .  
## MINK75121-10%                                        0.389 0.697512    
## MINK7512100%                                         0.414 0.679182    
## MINK751211-23%                                      -1.025 0.305403    
## MINK751224-36%                                      -1.667 0.095568 .  
## MINK751237-49%                                      -0.853 0.393454    
## MINK751250-62%                                      -1.448 0.147751    
## MINK751263-75%                                      -1.610 0.107348    
## MINK751276-88%                                      -0.492 0.622783    
## MINK751289-99%                                      -1.797 0.072290 .  
## MINK123M1-10%                                       -2.271 0.023148 *  
## MINK123M100%                                        -0.081 0.935370    
## MINK123M11-23%                                      -1.657 0.097492 .  
## MINK123M24-36%                                      -1.684 0.092278 .  
## MINK123M37-49%                                      -1.686 0.091758 .  
## MINK123M50-62%                                      -0.801 0.422940    
## MINK123M63-75%                                      -0.755 0.450138    
## MINK123M76-88%                                      -1.031 0.302743    
## MINKGEM1-10%                                         0.814 0.415833    
## MINKGEM100%                                          0.898 0.369130    
## MINKGEM11-23%                                        0.159 0.873319    
## MINKGEM24-36%                                        0.111 0.911869    
## MINKGEM37-49%                                        0.372 0.709994    
## MINKGEM50-62%                                        0.408 0.683103    
## MINKGEM63-75%                                        0.384 0.701320    
## MINKGEM76-88%                                        1.057 0.290480    
## MINKGEM89-99%                                        0.637 0.523999    
## MKOOPKLA11-23%                                      -0.344 0.730907    
## MKOOPKLA24-36%                                      -0.536 0.591682    
## MKOOPKLA37-49%                                      -0.327 0.744039    
## MKOOPKLA50-62%                                       0.404 0.685966    
## MKOOPKLA63-75%                                       1.167 0.243201    
## MKOOPKLA76-88%                                       1.212 0.225366    
## MKOOPKLA89-99%                                       0.734 0.463084    
## PWAPART1-49                                          2.529 0.011447 *  
## PWAPART100-199                                       3.296 0.000984 ***
## PWAPART50-99                                         2.795 0.005200 ** 
## PWABEDR1-49                                          0.225 0.821853    
## PWABEDR100-199                                       1.006 0.314629    
## PWABEDR1000-4999                                    -0.107 0.914599    
## PWABEDR200-499                                      -0.378 0.705500    
## PWABEDR50-99                                         0.315 0.753140    
## PWABEDR500-999                                       0.275 0.783539    
## PWALAND1-49                                          0.006 0.995115    
## PWALAND100-199                                      -1.620 0.105176    
## PWALAND200-499                                      -2.084 0.037177 *  
## PWALAND50-99                                        -0.700 0.484022    
## PPERSAUT0                                            1.157 0.247478    
## PPERSAUT10,000-19,999                                0.252 0.801157    
## PPERSAUT1000-4999                                    1.312 0.189450    
## PPERSAUT200-499                                      0.847 0.397057    
## PPERSAUT500-999                                      1.072 0.283680    
## PPERSAUT5000-9999                                    0.962 0.336283    
## PBESAUT1000-4999                                     0.543 0.587474    
## PBESAUT500-999                                       0.170 0.865264    
## PBESAUT5000-9999                                     0.397 0.691089    
## PMOTSCO100-199                                       3.224 0.001266 ** 
## PMOTSCO1000-4999                                    -1.197 0.231196    
## PMOTSCO200-499                                      -0.505 0.613275    
## PMOTSCO500-999                                      -0.285 0.775768    
## PMOTSCO5000-9999                                    -0.503 0.614763    
## PVRAAUT0                                             0.064 0.949318    
## PVRAAUT1000-4999                                    -0.104 0.916898    
## PVRAAUT200-499                                      -0.045 0.964080    
## PVRAAUT5000-9999                                    -0.161 0.871896    
## PAANHANG1-49                                         0.259 0.795971    
## PAANHANG100-199                                      0.146 0.884074    
## PAANHANG200-499                                      0.130 0.896534    
## PAANHANG50-99                                        0.472 0.636979    
## PAANHANG500-999                                     -0.074 0.941046    
## PTRACTOR100-199                                      0.404 0.686540    
## PTRACTOR1000-4999                                    1.280 0.200693    
## PTRACTOR200-499                                      0.197 0.843690    
## PTRACTOR500-999                                      1.092 0.274810    
## PTRACTOR5000-9999                                    0.317 0.751169    
## PWERKT1-49                                          -0.015 0.987647    
## PWERKT100-199                                       -0.774 0.438976    
## PWERKT1000-4999                                     -0.258 0.796735    
## PWERKT200-499                                       -0.523 0.601142    
## PWERKT50-99                                         -0.373 0.709057    
## PBROM100-199                                         1.284 0.199135    
## PBROM1000-4999                                       0.197 0.843941    
## PBROM200-499                                         1.170 0.242194    
## PBROM50-99                                           1.305 0.192023    
## PBROM500-999                                         1.106 0.268733    
## PLEVEN0                                              0.379 0.704822    
## PLEVEN1-49                                           0.096 0.923661    
## PLEVEN10,000-19,999                                  0.288 0.773142    
## PLEVEN100-199                                        0.264 0.791417    
## PLEVEN1000-4999                                      0.003 0.997851    
## PLEVEN200-499                                        0.211 0.833035    
## PLEVEN50-99                                          0.186 0.852732    
## PLEVEN500-999                                        0.213 0.831261    
## PLEVEN5000-9999                                     -0.298 0.765964    
## PPERSONG1-49                                         0.767 0.442895    
## PPERSONG100-199                                     -0.594 0.552679    
## PPERSONG1000-4999                                   -0.179 0.857752    
## PPERSONG200-499                                      0.165 0.868583    
## PPERSONG50-99                                       -0.931 0.351673    
## PPERSONG500-999                                     -0.016 0.986947    
## PGEZONG100-199                                       3.947 7.98e-05 ***
## PGEZONG50-99                                        -0.452 0.651446    
## PWAOREG1000-4999                                     1.179 0.238324    
## PWAOREG200-499                                      -0.536 0.592310    
## PWAOREG500-999                                       0.500 0.617179    
## PWAOREG5000-9999                                     0.729 0.465907    
## PBRAND1-49                                          -0.139 0.889684    
## PBRAND10,000-19,999                                  0.113 0.909797    
## PBRAND100-199                                        2.100 0.035768 *  
## PBRAND1000-4999                                      0.893 0.371733    
## PBRAND200-499                                        3.716 0.000204 ***
## PBRAND50-99                                         -0.298 0.765975    
## PBRAND500-999                                        2.010 0.044429 *  
## PBRAND5000-9999                                     -0.256 0.798304    
## PZEILPL1-49                                          0.352 0.725143    
## PZEILPL100-199                                      -0.367 0.713256    
## PZEILPL50-99                                         3.381 0.000726 ***
## PPLEZIER1-49                                         2.658 0.007873 ** 
## PPLEZIER100-199                                      2.050 0.040388 *  
## PPLEZIER1000-4999                                    2.945 0.003242 ** 
## PPLEZIER200-499                                      1.006 0.314435    
## PPLEZIER50-99                                        0.414 0.678614    
## PPLEZIER500-999                                     -0.842 0.399616    
## PFIETS1-49                                          -0.105 0.916028    
## PINBOED1-49                                          0.399 0.690168    
## PINBOED100-199                                      -0.169 0.865677    
## PINBOED1000-4999                                    -0.076 0.939540    
## PINBOED200-499                                      -0.125 0.900662    
## PINBOED50-99                                         0.274 0.783877    
## PINBOED500-999                                       0.021 0.983483    
## PBYSTAND100-199                                      0.520 0.603011    
## PBYSTAND200-499                                      0.826 0.408604    
## PBYSTAND50-99                                        0.498 0.618166    
## PBYSTAND500-999                                      0.217 0.828347    
## AWAPART                                             -2.626 0.008660 ** 
## AWABEDR                                             -0.190 0.849577    
## AWALAND                                                 NA       NA    
## APERSAUT                                             2.258 0.023942 *  
## ABESAUT                                             -0.719 0.471920    
## AMOTSCO                                              0.463 0.643377    
## AVRAAUT                                              0.094 0.925049    
## AAANHANG                                            -0.210 0.833734    
## ATRACTOR                                            -0.855 0.392548    
## AWERKT                                               0.336 0.736726    
## ABROM                                               -1.336 0.181525    
## ALEVEN                                               2.659 0.007850 ** 
## APERSONG                                                NA       NA    
## AGEZONG                                                 NA       NA    
## AWAOREG                                             -0.624 0.532873    
## ABRAND                                              -0.979 0.327557    
## AZEILPL                                                 NA       NA    
## APLEZIER                                             0.594 0.552613    
## AFIETS                                               0.656 0.511632    
## AINBOED                                             -0.304 0.760991    
## ABYSTAND                                            -0.471 0.637708    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for gaussian family taken to be 0.05242127)
## 
##     Null deviance: 551.04  on 9821  degrees of freedom
## Residual deviance: 489.51  on 9338  degrees of freedom
## AIC: -612.31
## 
## Number of Fisher Scoring iterations: 2


It is interesting to note that all but one of the significant variables relate to the number of policies customers hold or the premiums (contributions) they pay on them. This implies that caravan policyholders are very likely to own other insurance policies. This finding is neither surprising nor particularly enlightening, since the data concern existing customers. It is also worth noting that customers holding a boat policy or a surfboard policy together with a high income (above 125,000) are more likely to hold a caravan policy: a customer segment oriented toward outdoor activities. We will see below the inclusion of "low education level", an interesting indicator of another group that might also buy caravan insurance.

The exploratory data analysis and the logistic regression (LR) model presented here produced interesting results, but no new insight likely to have a significant impact on targeted caravan-insurance marketing. In other words, they state the obvious: customers with more policies, and therefore higher contributions, and with time and money to spend on leisure activities, are also more likely to buy caravan insurance. While this information can help target potential customers, it remains fairly predictable. To reach our goal of finding a non-obvious customer profile, we decided to build, or rather learn, predictive models. We expected the insights provided by these models to be useful for developing a more nuanced marketing campaign, aimed at a customer profile (i.e., buyers) that would not otherwise have been targeted. Having visualized the importance of the variables in our customer classification/profiling, we will now rank them by degree of importance:

library(caret)
# ensure results are repeatable
set.seed(7)
prepared_data <- viz_data
# prepare training scheme: 10-fold cross-validation repeated 3 times
control <- trainControl(method = "repeatedcv", number = 10, repeats = 3)
# train a CART model (rpart) on scaled predictors
model <- train(CLASS ~ ., data = prepared_data, method = "rpart", preProcess = "scale", trControl = control)
# estimate and display the variable importance
importance <- varImp(model, scale = FALSE)
print(importance)


Automatic feature-selection methods can be used to build many models from different subsets of a dataset, and so to identify which attributes are, and are not, needed to build an accurate model.

A popular automatic feature-selection method provided by the caret R package is Recursive Feature Elimination, or RFE.

# define the control using a random forest selection function
control <- rfeControl(functions=rfFuncs, method="cv", number=10)
# run the RFE algorithm
results <- rfe(prepared_data[,1:85], prepared_data[,86], sizes=c(1:85), rfeControl=control)
# summarize the results
print(results)
# list the chosen features
predictors(results)
# plot the results
plot(results, type=c("g", "o"))


Surprisingly, recursive feature elimination gave a result close to that of the logistic regression model. (Only close, because features are evaluated in random order: when two dependent features are candidates, RFE retains only the one selected first.)
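The effect noted in parentheses can be illustrated on synthetic data (a minimal sketch; the variables x1, x2, x3 are made up and unrelated to the AssurExperts attributes): caret's findCorrelation() flags one member of a highly correlated pair for removal, which is essentially why, of two dependent features, only whichever RFE picks first survives.

```r
library(caret)

set.seed(7)
# x2 is almost a copy of x1; x3 is independent of both
x1 <- rnorm(200)
x2 <- x1 + rnorm(200, sd = 0.1)
x3 <- rnorm(200)

corMat <- cor(data.frame(x1, x2, x3))
# index of the predictor caret suggests dropping: one of the x1/x2 pair
findCorrelation(corMat, cutoff = 0.9)
```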

prepared_data$CLASS <- relevel(prepared_data$CLASS, "Yes")
# split according to the STATUS column, then drop it (keep the 85 predictors + CLASS)
learning.Set <- prepared_data[which(prepared_data$STATUS == "Learning"), ]
test.Set <- prepared_data[which(prepared_data$STATUS == "Test"), ]

learning.Set <- learning.Set[, 1:86]
test.Set <- test.Set[, 1:86]

# null model: the starting point for stepwise selection
modele <- glm(CLASS ~ 1, data = learning.Set, family = binomial)
str_constant <- "~ 1"
str_all <- "~SD1+SD2+SD3+SD4+SD5+SD6+SD7+SD8+SD9+SD10+SD11+SD12+SD13+SD14+SD15+SD16+SD17+SD18+SD19+SD20+SD21+SD22+SD23+SD24+SD25+SD26+SD27+SD28+SD29+SD30+SD31+SD32+SD33+SD34+SD35+SD36+SD37+SD38+SD39+SD40+SD41+SD42+SD43+PO44+PO45+PO46+PO47+PO48+PO49+PO50+PO51+PO52+PO53+PO54+PO55+PO56+PO57+PO58+PO59+PO60+PO61+PO62+PO63+PO64+PO65+PO66+PO67+PO68+PO69+PO70+PO71+PO72+PO73+PO74+PO75+PO76+PO77+PO78+PO79+PO80+PO81+PO82+PO83+PO84+PO85"

# MASS::stepAIC expects formulas in `scope`, hence as.formula() on the strings above
mod2 <- stepAIC(modele, scope = list(lower = as.formula(str_constant), upper = as.formula(str_all)), trace = TRUE, direction = "both")
## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred

## Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred
## 
## Call:
## glm(formula = CLASS ~ PO47 + SD43 + PO59 + PO82 + SD18 + SD21 + 
##     SD10 + PO46 + PO83 + SD41 + SD42 + PO58 + SD4 + PO44 + PO85 + 
##     PO80 + PO74 + SD7 + PO79 + SD28 + SD16 + SD22, family = binomial, 
##     data = learning.Set)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -3.1647   0.1725   0.2565   0.3744   1.8826  
## 
## Coefficients:
##              Estimate Std. Error z value Pr(>|z|)    
## (Intercept)   6.19224    0.52494  11.796  < 2e-16 ***
## PO47         -0.22355    0.02431  -9.196  < 2e-16 ***
## SD43         -0.05593    0.03559  -1.572 0.116048    
## PO59         -0.24533    0.07137  -3.437 0.000587 ***
## PO82         -2.04258    0.38299  -5.333 9.65e-08 ***
## SD18          0.09275    0.04084   2.271 0.023141 *  
## SD21          0.16145    0.08213   1.966 0.049312 *  
## SD10         -0.08609    0.03637  -2.367 0.017941 *  
## PO46          0.35244    0.18342   1.922 0.054667 .  
## PO83         -0.49134    0.20112  -2.443 0.014565 *  
## SD41          0.26512    0.12196   2.174 0.029718 *  
## SD42         -0.09974    0.05618  -1.775 0.075849 .  
## PO58         -0.90496    0.58036  -1.559 0.118920    
## SD4          -0.16131    0.07647  -2.109 0.034907 *  
## PO44         -0.15607    0.07597  -2.054 0.039943 *  
## PO85         -0.51565    0.30400  -1.696 0.089846 .  
## PO80          0.45738    0.26499   1.726 0.084345 .  
## PO74         12.56588  306.81856   0.041 0.967331    
## SD7          -0.05639    0.03474  -1.623 0.104613    
## PO79          3.63288    3.31767   1.095 0.273512    
## SD28         -0.08675    0.04353  -1.993 0.046261 *  
## SD16         -0.09673    0.04644  -2.083 0.037263 *  
## SD22         -0.06828    0.03260  -2.094 0.036234 *  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 2635.5  on 5821  degrees of freedom
## Residual deviance: 2295.4  on 5799  degrees of freedom
## AIC: 2341.4
## 
## Number of Fisher Scoring iterations: 15


We will therefore retain these variables to build our classification model and to score customers, so as to obtain their propensity ("appétence") levels.
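As a toy illustration of how such a subset of significant predictors can be extracted from a fitted logistic regression (shown here on R's built-in `mtcars` data, not the report's dataset, with a hypothetical 5% threshold):

```r
# Toy sketch: keep only the predictors whose Wald p-value is below 0.05.
fit   <- glm(am ~ wt + hp, data = mtcars, family = binomial)
coefs <- summary(fit)$coefficients   # columns: Estimate, Std. Error, z value, Pr(>|z|)
keep  <- setdiff(rownames(coefs)[coefs[, "Pr(>|z|)"] < 0.05], "(Intercept)")
keep  # names of the retained predictors
```

The same filter applied to the model summarized above would yield the list of variables retained for the modeling phase.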


## Modeling and profile extraction

We now turn to building a high-performing classification model and assigning a propensity level to each customer, using the features pre-selected for their importance and business relevance. To do so, we will use an advanced machine-learning technique, model stacking, which is widely used in targeted marketing to optimize both the selection of promising profiles and the assignment of scores. Let us start with boosting, using two of the best-known algorithms: stochastic gradient boosting and C5.0.

library(caret)  # provides trainControl(), train() and resamples()

control <- trainControl(method="repeatedcv", number=10, repeats=3)
seed <- 7
metric <- "Accuracy"
# C5.0
set.seed(seed)
fit.c50 <- train(CLASS~., data=final_features_data, method="C5.0", metric=metric, trControl=control)
# Stochastic Gradient Boosting
set.seed(seed)
fit.gbm <- train(CLASS~., data=final_features_data, method="gbm", metric=metric, trControl=control, verbose=FALSE)
# summarize results
boosting_results <- resamples(list(c5.0=fit.c50, gbm=fit.gbm))
summary(boosting_results)
dotplot(boosting_results)


We can see that stochastic gradient boosting produces the more accurate of the two boosting models. Let us now turn to bagging, trying bagged CART and, of course, random forest.

control <- trainControl(method="repeatedcv", number=10, repeats=3)
seed <- 7
metric <- "Accuracy"
# Bagged CART
set.seed(seed)
fit.treebag <- train(CLASS~., data=final_features_data, method="treebag", metric=metric, trControl=control)
# Random Forest
set.seed(seed)
fit.rf <- train(CLASS~., data=final_features_data, method="rf", metric=metric, trControl=control)
# summarize results
bagging_results <- resamples(list(treebag=fit.treebag, rf=fit.rf))
summary(bagging_results)
dotplot(bagging_results)


As the output of the code above shows, bagging with random forest yields a higher accuracy than tree bagging, so we will retain that technique. Finally, let us build the stacked model: given a list of caret models, the caretStack() function can be used to specify a higher-order model that learns how best to combine the predictions of the sub-models.

Let us first build five sub-models on our dataset, namely:

- Linear discriminant analysis (LDA)
- Classification and regression trees (CART)
- Logistic regression (via a generalized linear model, GLM)
- k-nearest neighbors (kNN)
- Support vector machine with a radial basis kernel (SVM)

library(caretEnsemble)  # provides caretList() and caretStack()

control <- trainControl(method="repeatedcv", number=10, repeats=3, savePredictions=TRUE, classProbs=TRUE)
algorithmList <- c('lda', 'rpart', 'glm', 'knn', 'svmRadial')
set.seed(seed)
models <- caretList(CLASS~., data=final_features_data, trControl=control, methodList=algorithmList)
results <- resamples(models)
summary(results)
dotplot(results)


We can see that the SVM produces the most accurate model, with an accuracy of 96.47%. When combining the predictions of different models through stacking, it is desirable that the sub-models' predictions be weakly correlated: this suggests the models are skillful in different ways, letting a new classifier learn how to get the best out of each of them and obtain an improved score.

If the sub-models' predictions were highly correlated (> 0.75), they would be making identical or very similar predictions, largely reducing the benefit of combining them.

# correlation between results
modelCor(results)
splom(results)


We can see that all pairs of predictions generally show low correlation. The two methods whose predictions are most strongly correlated are logistic regression (GLM) and kNN, at 0.428, which is not considered high (> 0.75). Let us combine the classifiers' predictions using a simple linear model.

# stack using glm
stackControl <- trainControl(method="repeatedcv", number=10, repeats=3, savePredictions=TRUE, classProbs=TRUE)
set.seed(seed)
stack.glm <- caretStack(models, method="glm", metric="Accuracy", trControl=stackControl)
print(stack.glm)


We can see that accuracy rises to 96.99%, a slight improvement over SVM alone. This is also an improvement over using random forest alone on the dataset, as shown above. We can also use more sophisticated algorithms to combine the predictions and learn when each method performs best. Here, we use the random forest algorithm to combine the predictions.

# stack using random forest
set.seed(seed)
stack.rf <- caretStack(models, method="rf", metric="Accuracy", trControl=stackControl)
print(stack.rf)


We can see that this raises accuracy to 98.56%, an impressive improvement over using random forest alone.

### Conclusion for modeling and optimization

Having tried all three approaches to building an optimized classification model on our selected customer features, we found that stacking gave the best model for identifying customer propensity. It remains to define a score-assignment algorithm representing each customer's propensity for this caravan insurance policy. To do so, we naturally rely on the customer features. We define the target customer profile as the best profile classified as "NO" on the CLASS attribute (by "best profile" we mean the profile closest to the "YES" class in terms of the membership probability assigned by the model). The score, or propensity level, is then weighted by the degree of similarity between this reference profile and the profile of the customer under study. If our model classifies a customer as "YES", we leave the score unchanged, or simply assign a score of 1, since we are most interested in customers classified as "NO" who have a low probability of belonging to their class; for these we measure the similarity of their profile with the typical profile (in terms of the values and categories of the selected features) and assign their score accordingly.
\[ \text{Profil} = \sum_i \text{coef}(i)\,\text{var}(i) \]
where the coefficients and variables are those given by the logistic regression model.

Let \(S_a\) denote the membership probability (given by the model) and \(C_a\) the predicted class. The score is then:
\[ \text{Score}(a) = \frac{1}{2}\,S_a + \frac{1}{2}\,\frac{\text{Profil}(a)}{\text{Profil}(typ)} \]
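This scoring rule can be sketched in R as follows (all names and numeric values below are hypothetical; in practice the coefficients come from the fitted logistic regression and the membership probability from the stacked model):

```r
# Hedged sketch of the propensity score: half the model's membership
# probability, half the ratio of the customer's linear profile to the
# typical ("best NO") profile.
profile <- function(x, coefs) sum(coefs * x)  # linear predictor over selected features

propensity_score <- function(x, coefs, prob, typical_profile) {
  0.5 * prob + 0.5 * profile(x, coefs) / typical_profile
}

# Toy values: two selected features with illustrative coefficients.
coefs   <- c(-0.22, 0.35)
typical <- profile(c(5, 1), coefs)   # profile of the reference customer
propensity_score(c(4, 1), coefs, prob = 0.40, typical_profile = typical)
```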
## Conclusion

The analysis and model development above were carried out to obtain a typical customer profile for holders of caravan insurance policies. To reach this goal, we developed models using several different classification methods. The main purpose of building a target customer profile is to use it to correctly identify prospective customers: a model that identifies them accurately would increase the success rate of a marketing campaign.

We expect that further improvements to the model-building process would yield equally insightful models. One way to achieve this is to explore removing or grouping the policy-related variables. As already noted, all the models include variables describing either the number of policies held or the contributions (premiums) paid per policy type. The reason is their preponderance in the dataset: roughly one third of all variables concern either policy counts or policy contributions. Since these two kinds of variables are essentially two aspects of a single insurance-policy type, it is possible to discard one of them and develop models on a reduced dataset. A preliminary pairwise correlation analysis between Number and Contribution suggests this would be feasible without significant loss of information. Classifier models based on such a reduced set of variables would likely have fewer independent predictors, which would be especially beneficial for an SVM-based stacked model.
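The pairwise Number/Contribution check suggested above can be sketched as follows (on simulated data with hypothetical names; the real check would pair each "number of policies" column with its "contribution" counterpart in the dataset):

```r
# Hedged sketch on simulated data: a "number of policies" variable and a
# "contribution" variable that is, by construction, strongly tied to it.
set.seed(7)
n_policies   <- rpois(200, lambda = 2)
contribution <- 3 * n_policies + rnorm(200, sd = 0.5)
cor(n_policies, contribution)  # close to 1: one of the pair is largely redundant
```

A correlation this high between paired columns is what would justify dropping one member of each pair before refitting the models.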